An efficient method for computing orthogonal discriminant vectors
Authors
Abstract
We propose a linear discriminant analysis method in which every discriminant vector, except the first, is obtained by maximizing a Fisher criterion defined in a transformed space: the null space of the previously obtained discriminant vectors. All of these discriminant vectors are used for dimensionality reduction. We also propose two algorithms to implement the model. Based on the … is not singular. The experimental results show that the proposed method is effective and efficient. © 2010 Elsevier B.V. All rights reserved.
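The greedy scheme the abstract describes — each new discriminant vector maximizes a Fisher criterion restricted to the null space (orthogonal complement) of the vectors found so far — can be sketched in NumPy. This is a minimal illustration of the deflation idea, not the paper's actual algorithm: the explicit projector `P` and the small `1e-8` ridge that keeps the projected within-class scatter invertible are assumptions of this sketch.

```python
import numpy as np

def fisher_direction(Sb, Sw):
    """Leading Fisher direction: top eigenvector of Sw^{-1} Sb."""
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / np.linalg.norm(w)

def orthogonal_discriminants(Sb, Sw, k):
    """Greedy deflation: each new vector maximizes the Fisher
    criterion restricted to the orthogonal complement of the
    span of the previously obtained vectors."""
    d = Sb.shape[0]
    W = []
    P = np.eye(d)                  # projector onto the current complement
    for _ in range(k):
        Sb_p = P @ Sb @ P          # restrict scatter matrices to complement
        Sw_p = P @ Sw @ P + 1e-8 * np.eye(d)  # ridge keeps Sw_p invertible
        w = fisher_direction(Sb_p, Sw_p)
        w = P @ w                  # ensure w lies in the complement
        w /= np.linalg.norm(w)
        W.append(w)
        P = P - np.outer(w, w)     # deflate: remove the new direction
    return np.column_stack(W)
```

Because every accepted direction is projected into the complement of its predecessors before normalization, the returned columns form an orthonormal set by construction.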
Similar Resources
A rank-one update algorithm for fast solving kernel Foley-Sammon optimal discriminant vectors
Discriminant analysis plays an important role in statistical pattern recognition. A popular method is the Foley-Sammon optimal discriminant vectors (FSODVs) method, which aims to find an optimal set of discriminant vectors that maximize the Fisher discriminant criterion under the orthogonal constraint. The FSODVs method outperforms the classic Fisher linear discriminant analysis (FLDA) method i...
A New Regularized Orthogonal Local Fisher Discriminant Analysis for Image Feature Extraction
Local Fisher Discriminant Analysis (LFDA) is a feature extraction method which combines the ideas of Fisher discriminant analysis (FDA) and locality preserving projection (LPP). It works well for multimodal problems, but LFDA suffers from the under-sampled problem of linear discriminant analysis (LDA). To deal with this problem, we propose a regularized orthogonal local Fisher discriminant ...
Orthogonal Signal Decomposition Coupled with Bayes and Fuzzy Discriminant Classifiers for Ultrasonic Flaw Detection
The performance of an ultrasonic flaw detection system is measured by its success in differentiating flaw echoes from those scattered by microstructures (e.g. grain scattering or clutter). In order to successfully detect and classify the target echoes against background noise, an effective feature extraction method and a robust decision process are required. In this study, we present a comparative e...
A Multi Linear Discriminant Analysis Method Using a Subtraction Criteria
Linear dimension reduction has been used in different applications such as image processing and pattern recognition. All these methods fold the original data into vectors and project them onto a low-dimensional space. But in some applications we may face data that are not vectors, such as image data. Folding multidimensional data into vectors causes the curse of dimensionality and mixes the differe...
Generalized Maximal Margin Discriminant Analysis for Speech Emotion Recognition
A novel speech emotion recognition method based on the generalized maximum margin discriminant analysis (GMMDA) method is proposed in this paper. GMMDA is a multi-class extension of our proposed two-class dimensionality reduction method based on maximum margin discriminant analysis (MMDA), which utilizes the normal direction of the optimal hyperplane of a linear support vector machine (SVM) as the pro...
Journal:
- Neurocomputing
Volume 73, Issue -
Pages -
Publication date: 2010